Stochastic projection based approach for gradient free physics informed learning


Abstract

We propose a stochastic projection-based, gradient-free physics-informed neural network. The proposed approach, referred to as the stochastic projection based physics informed neural network (SP-PINN), blends upscaled stochastic projection theory with the recently proposed physics-informed neural network (PINN). This results in a framework that is robust and can solve problems involving complex solution domains and discontinuities. SP-PINN is a gradient-free approach that addresses the computational bottleneck associated with automatic differentiation in conventional PINNs. The efficacy of the proposed approach is illustrated by a number of examples involving regular domains, discontinuous response, and phase field fracture mechanics problems. Case studies varying the network architecture (activation function) and the number of collocation points have also been presented.
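The abstract does not spell out the algorithm, but the gradient-free ingredient can be illustrated with a toy sketch: derivatives entering the PDE residual are approximated by evaluating the network at perturbed collocation points (plain central differences here) instead of by automatic differentiation, and the weights are tuned by a derivative-free accept-if-better search. Everything below — the network, the 1D Poisson problem, and the optimizer — is a hypothetical stand-in, not the authors' SP-PINN.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP surrogate u_theta(x); a hypothetical stand-in, not the SP-PINN network.
def init(n_hidden=16):
    return [rng.normal(0.0, 1.0, (1, n_hidden)), np.zeros(n_hidden),
            rng.normal(0.0, 1.0, (n_hidden, 1)), np.zeros(1)]

def u(theta, x):
    W1, b1, W2, b2 = theta
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Toy PDE: u''(x) = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0.
xs = np.linspace(0.0, 1.0, 64).reshape(-1, 1)
f = -np.pi**2 * np.sin(np.pi * xs)

def loss(theta, eps=1e-3):
    # u'' via central differences at perturbed collocation points --
    # no automatic differentiation anywhere in the residual.
    upp = (u(theta, xs + eps) - 2.0 * u(theta, xs) + u(theta, xs - eps)) / eps**2
    pde = float(np.mean((upp - f) ** 2))
    bc = float(u(theta, np.zeros((1, 1))) ** 2 + u(theta, np.ones((1, 1))) ** 2)
    return pde + 100.0 * bc  # penalize boundary-condition violation

# Derivative-free weight search: keep a random perturbation only if it helps.
theta = init()
l0 = loss(theta)
best = l0
for _ in range(2000):
    cand = [p + 0.05 * rng.normal(size=p.shape) for p in theta]
    lc = loss(cand)
    if lc < best:
        theta, best = cand, lc
```

The accept-if-better search merely stands in for whatever optimizer one prefers; the point is only that neither the residual nor the weight update requires backpropagation.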


Similar articles

A Novel Gradient Projection Approach for Fourier-Based Image Restoration

This work deals with the ill-posed inverse problem of reconstructing a two-dimensional image of an unknown object starting from sparse and nonuniform measurements of its Fourier Transform. In particular, if we consider a priori information about the target image (e.g., the nonnegativity of the pixels), this inverse problem can be reformulated as a constrained optimization problem, in which the ...
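The nonnegativity prior mentioned above is the classic setting for gradient projection: take a gradient step, then clip back onto the feasible set. A minimal sketch on a synthetic least-squares problem — the matrix and the nonnegative orthant are illustrative assumptions, not the paper's Fourier-domain operator:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 10))              # illustrative forward operator
x_true = np.abs(rng.normal(size=10))       # nonnegative ground truth
b = A @ x_true

x = np.zeros(10)
step = 1.0 / np.linalg.norm(A, 2) ** 2     # safe 1/L step for the smooth term
for _ in range(500):
    grad = A.T @ (A @ x - b)               # gradient of 0.5*||A x - b||^2
    x = np.maximum(x - step * grad, 0.0)   # projection onto {x >= 0}
```

Every iterate is feasible by construction, which is what makes the prior easy to enforce with this scheme.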


Gradient Projection Learning for Parametric Nonrigid Registration

A potentially large anatomical variability among subjects in a population makes nonrigid image registration techniques prone to inaccuracies and to high computational costs in their optimisation. In this paper, we propose a new learning-based approach to accelerate the convergence rate of any chosen parametric energy-based image registration method. From a set of training images and their corre...


Stochastic Gradient Descent with Only One Projection

Although many variants of stochastic gradient descent have been proposed for large-scale convex optimization, most of them require projecting the solution at each iteration to ensure that the obtained solution stays within the feasible domain. For complex domains (e.g., positive semidefinite cone), the projection step can be computationally expensive, making stochastic gradient descent unattrac...
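The trick the title refers to can be mimicked on a toy problem: run unconstrained stochastic gradient steps and project onto the feasible set — here the unit ball — only once at the end, instead of once per iteration. The quadratic objective and step sizes below are illustrative choices, not the paper's algorithm or its regret analysis:

```python
import numpy as np

rng = np.random.default_rng(2)
target = np.array([2.0, 0.0])      # unconstrained optimum, outside the unit ball

w = np.zeros(2)
for t in range(1, 1001):
    # Noisy gradient of 0.5*||w - target||^2; no projection inside the loop.
    g = (w - target) + 0.1 * rng.normal(size=2)
    w -= g / t                     # 1/t step sizes

w /= max(1.0, np.linalg.norm(w))   # the single projection onto {||w|| <= 1}
```

The final iterate lands near (1, 0), the constrained optimum, without ever paying for a per-iteration projection.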


Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent

Stochastic Gradient Descent (SGD) is a popular algorithm that can achieve state-of-the-art performance on a variety of machine learning tasks. Several researchers have recently proposed schemes to parallelize SGD, but all require performance-destroying memory locking and synchronization. This work aims to show using novel theoretical analysis, algorithms, and implementation that SGD can be implem...
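The lock-free idea can be sketched with threads updating one shared weight vector with no synchronization at all. This is illustrative only: CPython's GIL serializes the actual work, and the toy problem is dense rather than the sparse workloads Hogwild targets.

```python
import threading

import numpy as np

# Shared data: noiseless linear regression with true weights all equal to 1.
rng = np.random.default_rng(3)
d = 20
X = rng.normal(size=(400, d))
y = X @ np.ones(d)
w = np.zeros(d)                          # shared weights, no lock anywhere

def worker(rows, lr=0.01, epochs=20):
    for _ in range(epochs):
        for i in rows:
            err = float(X[i] @ w - y[i])
            w[:] -= lr * err * X[i]      # unsynchronized in-place write

threads = [threading.Thread(target=worker, args=(range(k, 400, 4),))
           for k in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Interleaved, unlocked updates just reorder the SGD steps here, and the shared iterate still converges — the intuition behind the lock-free scheme.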


Projection-free Online Learning

The computational bottleneck in applying online learning to massive data sets is usually the projection step. We present efficient online learning algorithms that eschew projections in favor of much more efficient linear optimization steps using the Frank-Wolfe technique. We obtain a range of regret bounds for online convex optimization, with better bounds for specific cases such as stochastic ...
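In Frank-Wolfe, the projection is replaced by a linear-minimization oracle; over the l1 ball that oracle simply returns a signed coordinate vertex. A minimal offline sketch — the quadratic objective is an illustrative assumption, and the paper's setting is online with regret bounds:

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):                               # gradient of 0.5*||A x - b||^2
    return A.T @ (A @ x - b)

x = np.zeros(2)                            # feasible start in {||x||_1 <= 1}
for t in range(2000):
    g = grad(x)
    i = int(np.argmax(np.abs(g)))
    s = np.zeros(2)
    s[i] = -np.sign(g[i])                  # oracle answer: a vertex of the l1 ball
    gamma = 2.0 / (t + 2.0)                # standard Frank-Wolfe step schedule
    x = (1.0 - gamma) * x + gamma * s      # convex combination keeps x feasible
```

Each iteration costs one gradient and one coordinate-wise argmax — no projection ever — and feasibility is preserved because every iterate is a convex combination of feasible points.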



Journal

Journal title: Computer Methods in Applied Mechanics and Engineering

Year: 2023

ISSN: 0045-7825, 1879-2138

DOI: https://doi.org/10.1016/j.cma.2022.115842